Beyond the Maximum Storage Capacity Limit in Hopfield Recurrent Neural Networks

Authors
Abstract


Similar Articles

On the Maximum Storage Capacity of the Hopfield Model

Recurrent neural networks (RNNs) have traditionally been of great interest for their capacity to store memories. In past years, several works have been devoted to determining the maximum storage capacity of RNNs, especially for the case of the Hopfield network, the most popular kind of RNN. Analyzing the thermodynamic limit of the statistical properties of the Hamiltonian corresponding to the Hopfi...


Saturated Linear Recurrent Neural Networks with Maximum Capacity

We consider discrete and recurrent neural network models with saturated linear neurons. A definition of capacity is discussed and conditions that assure infinite capacity are established. The aim of this paper is to study networks with maximum capacity and to what extent maximal capacity relates to the network connecting weights.


Storage Capacity of Letter Recognition in Hopfield Networks

Associative memory is a dynamical system which has a number of stable states, each with a domain of attraction around it [1]. If the system starts at any state in the domain, it converges to the locally stable state, which is called an attractor. In 1982, Hopfield [2] proposed a fully connected neural network model of associative memory in which patterns can be stored by being distributed among neuro...
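The retrieval dynamics described in this abstract (Hebbian storage of patterns, convergence of a corrupted state to its attractor) can be sketched as follows; this is a minimal illustration, not the paper's implementation, and the network size, pattern count, noise level, and random seed are assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100  # number of neurons
P = 5    # number of stored patterns, well below the classical ~0.138*N limit

# Random bipolar (+1/-1) patterns to store.
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian weight matrix: W = (1/N) * sum_mu xi^mu (xi^mu)^T, zero self-coupling.
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0)

def recall(state, max_steps=20):
    """Synchronously update the state until it stops changing (an attractor)."""
    for _ in range(max_steps):
        new = np.sign(W @ state)
        new[new == 0] = 1  # break ties deterministically
        if np.array_equal(new, state):
            break
        state = new
    return state

# Corrupt a stored pattern by flipping 10% of its bits, then let the
# dynamics pull it back into the pattern's basin of attraction.
probe = patterns[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
probe[flip] *= -1
recovered = recall(probe)

# Overlap of 1.0 means perfect retrieval of the stored pattern.
overlap = (recovered @ patterns[0]) / N
```

At this low load (P/N = 0.05) the corrupted probe typically converges back to the stored pattern; pushing P past the capacity threshold is exactly what destroys these attractors.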


Enhanced storage capacity with errors in scale-free Hopfield neural networks: An analytical study

The Hopfield model is a pioneering neural network model with associative memory retrieval. The analytical solution of the model in mean field limit revealed that memories can be retrieved without any error up to a finite storage capacity of O(N), where N is the system size. Beyond the threshold, they are completely lost. Since the introduction of the Hopfield model, the theory of neural network...


Capacity and Trainability in Recurrent Neural Networks

Two potential bottlenecks on the expressiveness of recurrent neural networks (RNNs) are their ability to store information about the task in their parameters, and to store information about the input history in their units. We show experimentally that all common RNN architectures achieve nearly the same per-task and per-unit capacity bounds with careful training, for a variety of tasks and stac...



Journal

Journal title: Entropy

Year: 2019

ISSN: 1099-4300

DOI: 10.3390/e21080726